Least Square Variational Bayesian Autoencoder with Regularization

Author

  • Gautam Ramachandra
Abstract

In recent years, Variational Autoencoders have become one of the most popular approaches to unsupervised learning of complicated distributions. The Variational Autoencoder (VAE) provides more efficient reconstruction performance than a traditional autoencoder, and variational autoencoders give better approximations than MCMC. The VAE defines a generative process in terms of ancestral sampling through a cascade of hidden stochastic layers, and it is trained to maximise the variational lower bound. Here we try to maximise the likelihood while at the same time making a good approximation of the data; this essentially trades off the data log-likelihood against the KL divergence from the true posterior. This paper describes the scenario in which we wish to find a point estimate of the parameters θ of some parametric model in which we generate each observation xi by first sampling a local latent variable zi ∼ pθ(z) and then sampling the associated observation xi ∼ pθ(x | z). Here we use a least-squares loss function with regularization in the reconstruction of the image; the least-squares loss function was found to give better reconstructed images and a faster training time.
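To make the trade-off above concrete: the objective maximised by a VAE is the evidence lower bound (ELBO), E_{qφ(z|x)}[log pθ(x|z)] − KL(qφ(z|x) ‖ pθ(z)), where the first term rewards reconstruction and the second keeps the approximate posterior close to the prior. The sketch below is a minimal, hedged illustration of a VAE whose reconstruction term is a least-squares (mean squared error) penalty with an added L2 regularization term; it is not the author's released implementation, and the layer sizes, the coefficient l2_weight, and the names LeastSquaresVAE and vae_loss are illustrative assumptions.

```python
# Minimal sketch (an assumption, not the paper's released code): a VAE trained with a
# least-squares reconstruction term plus explicit L2 regularization, as in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LeastSquaresVAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)          # encoder hidden layer
        self.enc_mu = nn.Linear(h_dim, z_dim)       # posterior mean
        self.enc_logvar = nn.Linear(h_dim, z_dim)   # posterior log-variance
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def forward(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        std = torch.exp(0.5 * logvar)
        z = mu + std * torch.randn_like(std)        # reparameterization trick
        return self.dec(z), mu, logvar

def vae_loss(x, x_rec, mu, logvar, model, l2_weight=1e-4):
    # Least-squares (MSE) reconstruction term instead of the usual Bernoulli/BCE term.
    rec = F.mse_loss(x_rec, x, reduction='sum')
    # KL divergence between q(z|x) = N(mu, sigma^2) and the standard normal prior.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    # Explicit L2 regularization on the network weights (coefficient is an assumption).
    l2 = sum(p.pow(2).sum() for p in model.parameters())
    return rec + kl + l2_weight * l2

# Usage on a batch of flattened images with values in [0, 1]:
model = LeastSquaresVAE()
x = torch.rand(32, 784)
x_rec, mu, logvar = model(x)
loss = vae_loss(x, x_rec, mu, logvar, model)
loss.backward()
```

Minimising the MSE term together with the KL term corresponds to maximising the variational lower bound under a fixed-variance Gaussian observation model, which is one standard way to read the least-squares reconstruction described in the abstract; the explicit L2 penalty plays the role of the regularization named in the title.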


Similar resources

Variational Generative Stochastic Networks with Collaborative Shaping

We develop an approach to training generative models based on unrolling a variational autoencoder into a Markov chain, and shaping the chain’s trajectories using a technique inspired by recent work in Approximate Bayesian computation. We show that the global minimizer of the resulting objective is achieved when the generative model reproduces the target distribution. To allow finer control over...


An inexact alternating direction method with SQP regularization for the structured variational inequalities

In this paper, we propose an inexact alternating direction method with square quadratic proximal (SQP) regularization for the structured variational inequalities. The predictor is obtained via solving SQP system approximately under significantly relaxed accuracy criterion and the new iterate is computed directly by an explicit formula derived from the original SQP method. Under appropriat...


Shape optimization in laminar flow with a label-guided variational autoencoder

Computational design optimization in fluid dynamics usually requires solving non-linear partial differential equations numerically. In this work, we explore a Bayesian optimization approach to minimize an object’s drag coefficient in laminar flow based on predicting drag directly from the object shape. Jointly training an architecture combining a variational autoencoder mapping shapes to laten...


Local Kernels that Approximate Bayesian Regularization and Proximal Operators

In this work, we broadly connect kernel-based filtering (e.g. approaches such as the bilateral filters and nonlocal means, but also many more) with general variational formulations of Bayesian regularized least squares, and the related concept of proximal operators. The latter set of variational/Bayesian/proximal formulations often result in optimization problems that do not have closed-form so...
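As background for the connection drawn above: the proximal operator of a function f maps a point v to argmin_x f(x) + (1/2)‖x − v‖², and for the ℓ1 norm this reduces to elementwise soft-thresholding, which coincides with the MAP estimate under Gaussian noise and a Laplace prior (ℓ1-regularized least squares). The snippet below shows only that standard textbook special case as a hedged illustration; it is not the kernel construction developed in the cited paper, and the function name prox_l1 is an assumption.

```python
import numpy as np

def prox_l1(v, lam):
    # Proximal operator of lam * ||x||_1: argmin_x lam*||x||_1 + 0.5*||x - v||^2,
    # i.e. elementwise soft-thresholding of the input vector v.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
```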


Stick-breaking Variational Autoencoders

We extend Stochastic Gradient Variational Bayes to perform posterior inference for the weights of Stick-Breaking processes. This development allows us to define a Stick-Breaking Variational Autoencoder (SB-VAE), a Bayesian nonparametric version of the variational autoencoder that has a latent representation with stochastic dimensionality. We experimentally demonstrate that the SB-VAE, and a sem...
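For context on the stick-breaking construction mentioned above: stick-breaking builds weights from fractions vk in (0, 1) via πk = vk ∏_{j<k}(1 − vj), which is what gives the SB-VAE latent representation its stochastic dimensionality. The snippet below is a generic illustration of that standard construction; the function name stick_breaking and the use of PyTorch tensors are assumptions, not code from the cited paper.

```python
import torch

def stick_breaking(v):
    # v: (batch, K) stick fractions in (0, 1), e.g. sampled from a Beta or
    # Kumaraswamy distribution. Returns weights pi with
    # pi_k = v_k * prod_{j<k} (1 - v_j) for a truncation level of K sticks.
    ones = torch.ones(v.shape[0], 1)
    remaining = torch.cumprod(1.0 - v, dim=1)        # prod_{j<=k} (1 - v_j)
    return v * torch.cat([ones, remaining[:, :-1]], dim=1)
```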



Journal:
  • CoRR

Volume: abs/1707.03134  Issue:

Pages: -

Publication date: 2017